Order Statistics Based Estimator for Renyi’s Entropy
Authors
Abstract
Several types of entropy estimators exist in the information-theory literature. Most of these estimators explicitly estimate the density of the available data samples before computing the entropy. The sample-spacing entropy estimator, however, avoids this intermediate step and computes the entropy directly from the order statistics. In this paper, we extend our horizon beyond Shannon’s definition of entropy and analyze estimation performance at higher orders of alpha, using Renyi’s generalized entropy estimator. We show that the estimators for higher orders of alpha better approximate the true entropy for an exponential family of distributions. A practical application of this estimator is demonstrated by computing the mutual information between functionally coupled systems. During the estimation process, the joint distributions are decomposed into the sum of their marginals by using linear ICA.
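As a rough illustration of the idea, an order-statistics (m-spacing) estimator can be built by approximating the density on each spacing of the sorted sample as m/(n·spacing) and plugging that into Renyi’s definition H_α = (1/(1−α)) log E[f^(α−1)]. The sketch below is a common plug-in construction under these assumptions, not necessarily the exact estimator derived in the paper; the function name and the √n choice of m are illustrative.

```python
import numpy as np

def renyi_entropy_mspacing(x, alpha, m=None):
    """m-spacing (order-statistics) sketch of a Renyi entropy estimator.

    The density over each m-spacing of the sorted sample is approximated
    by m / (n * spacing) and substituted into
    H_alpha = 1/(1 - alpha) * log E[f^(alpha - 1)].
    """
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    if m is None:
        m = max(1, int(np.sqrt(n)))          # heuristic window choice
    spacings = x[m:] - x[:-m]                # X_(i+m) - X_(i)
    spacings = np.maximum(spacings, 1e-12)   # guard against tied samples
    if abs(alpha - 1.0) < 1e-8:
        # alpha -> 1 limit recovers a Vasicek-style Shannon estimator
        return np.mean(np.log(n * spacings / m))
    vals = (n * spacings / m) ** (1.0 - alpha)
    return np.log(np.mean(vals)) / (1.0 - alpha)
```

For a uniform(0, 1) sample the true Renyi entropy is 0 for every order alpha, which gives a quick sanity check of the estimator.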
Similar resources
Blind source separation using Renyi's α-marginal entropies
We have recently suggested the minimization of a nonparametric estimator of Renyi’s mutual information as a criterion for blind source separation. Using a two-stage topology, consisting of spatial whitening and a series of Givens rotations, the cost function reduces to the sum of marginal entropies, just as in the Shannon entropy case. Since we use a Parzen window density estimator and elim...
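The two-stage topology described above can be sketched in two dimensions: whiten the mixtures, then search over a single Givens rotation angle for the rotation that minimizes the sum of marginal entropies (joint entropy is rotation-invariant after whitening, so this minimizes mutual information). The sketch below is an assumption-laden illustration: it uses a simple Vasicek-style spacing estimator for the marginal entropies rather than the Parzen-based Renyi criterion of the paper, and the function names and grid search are ours.

```python
import numpy as np

def vasicek_entropy(x, m=None):
    # illustrative m-spacing Shannon entropy estimator (the paper itself
    # uses a Parzen-window Renyi criterion instead)
    x = np.sort(np.asarray(x, float))
    n = x.size
    m = m or max(1, int(np.sqrt(n)))
    sp = np.maximum(x[m:] - x[:-m], 1e-12)
    return np.mean(np.log(n * sp / m))

def separate_2d(mixtures):
    """Two-stage BSS sketch: spatial whitening followed by one Givens
    rotation chosen to minimize the sum of marginal entropies."""
    X = mixtures - mixtures.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(X))     # eigendecomposition of covariance
    Z = (E / np.sqrt(d)).T @ X           # whitened data: cov(Z) ~ I
    best_cost, best_theta = np.inf, 0.0
    for theta in np.linspace(0.0, np.pi / 2, 180):
        c, s = np.cos(theta), np.sin(theta)
        Y = np.array([[c, -s], [s, c]]) @ Z   # Givens rotation
        cost = vasicek_entropy(Y[0]) + vasicek_entropy(Y[1])
        if cost < best_cost:
            best_cost, best_theta = cost, theta
    c, s = np.cos(best_theta), np.sin(best_theta)
    return np.array([[c, -s], [s, c]]) @ Z
```

Searching angles in [0, π/2) suffices in 2-D because rotations outside that range only permute or flip the recovered sources, which leaves the marginal-entropy cost unchanged.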
Entropy Minimization Algorithm for Multilayer Perceptrons
We have previously proposed the use of quadratic Renyi’s error entropy with a Parzen density estimator with Gaussian kernels as an alternative optimality criterion for supervised neural network training, and showed that it produces better performance on the test data compared to the MSE. The error entropy criterion imposes the minimization of average information content in the error signal rath...
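The quadratic (alpha = 2) Renyi error entropy with a Gaussian Parzen estimator has a convenient closed form: the expectation reduces to a double sum of Gaussian kernels over error pairs (the "information potential"), H2 = −log((1/N²) Σᵢⱼ G(eᵢ − eⱼ; 2σ²)). A minimal sketch of that quantity, with an illustrative function name and kernel width:

```python
import numpy as np

def quadratic_renyi_entropy(e, sigma=0.5):
    """Parzen-window estimate of quadratic (alpha = 2) Renyi entropy.

    Computes H2 = -log( (1/N^2) * sum_ij G(e_i - e_j; 2*sigma^2) ),
    where G is a Gaussian kernel; the variance doubles because the
    pairwise term is the convolution of two Parzen kernels.  The
    double sum is the 'information potential' minimized in training.
    """
    e = np.asarray(e, dtype=float)
    diff = e[:, None] - e[None, :]               # all pairwise differences
    var = 2.0 * sigma ** 2
    G = np.exp(-diff ** 2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)
    return -np.log(G.mean())
```

Minimizing this entropy concentrates the error distribution, which is the sense in which it reduces the average information content of the error signal.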
An analysis of entropy estimators for blind source separation
An extensive analysis of a non-parametric, information-theoretic method for instantaneous blind source separation (BSS) is presented. As a result a modified stochastic information gradient estimator is proposed to reduce the computational complexity and to allow the separation of sub-Gaussian sources. Interestingly, the modification enables the method to simultaneously exploit spatial and spect...
Information Theoretic Learning
Abstract of dissertation presented to the Graduate School of the University of Florida in partial fulfillment of the requirements for the degree of Doctor of Philosophy. INFORMATION THEORETIC LEARNING: RENYI’S ENTROPY AND ITS APPLICATIONS TO ADAPTIVE SYSTEM TRAINING. By Deniz Erdogmus, May 2002. Chairman: Dr. Jose C. Principe. Major Department: Electrical and Computer Engineering. Traditionally, second-order ...
Information-Theoretic Learning Using Renyi’s Quadratic Entropy
Learning from examples has traditionally been based on correlation or on the mean square error (MSE) criterion, despite the fact that learning is intrinsically related to the extraction of information from examples. The problem is that Shannon’s information entropy, which has a sound theoretical foundation, is not easy to implement in a learning-from-examples scenario. In this paper, Reny...